    TPC-H Analyzed: Hidden Messages and Lessons Learned from an Influential Benchmark

    The TPC-D benchmark was developed almost 20 years ago, and even though its current incarnation as TPC-H could be considered superseded by TPC-DS, one can still learn from it. We focus on the technical level, summarizing the challenges posed by the TPC-H workload as we now understand them.

    Deterministic models of quantum fields

    Deterministic dynamical models are discussed which can be described in quantum mechanical terms. In particular, a local quantum field theory is presented which is a supersymmetric classical model. The Hilbert space approach of Koopman and von Neumann is used to study the classical evolution of an ensemble of such systems. Its Liouville operator is decomposed into two contributions, with positive and negative spectrum, respectively. The unstable negative part is eliminated by a constraint on physical states, which is invariant under the Hamiltonian flow. Thus, choosing suitable variables, the classical Liouville equation becomes a functional Schrödinger equation of a genuine quantum field theory. We briefly mention a U(1) gauge theory with "varying alpha" or dilaton coupling where a corresponding quantized theory emerges in the phase-space approach. It is energy-parity symmetric and, therefore, a prototype of a model in which the cosmological constant is protected by a symmetry.
    Comment: 6 pages; synopsis of hep-th/0510267, hep-th/0503069, hep-th/0411176. Talk at Constrained Dynamics and Quantum Gravity - QG05, Cala Gonone (Sardinia, Italy), September 12-16, 2005. To appear in the proceedings.
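    For context, the Koopman-von Neumann construction the abstract builds on can be summarized as follows (a textbook sketch, not material from the paper itself). Classical phase-space densities \(\rho(x,p,t)\) are treated as Hilbert-space vectors, and the Liouville equation is formally a Schrödinger equation generated by the Liouville operator:

        \[
          i\,\partial_t \rho = \hat{L}\,\rho, \qquad
          \hat{L} = -i\left(\frac{\partial H}{\partial p}\frac{\partial}{\partial x}
                          - \frac{\partial H}{\partial x}\frac{\partial}{\partial p}\right).
        \]

    Unlike a quantum Hamiltonian, \(\hat{L}\) generically has both positive and negative spectrum, which is exactly what the constraint on physical states mentioned in the abstract is designed to remove.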

    Biorthogonal Quantum Systems

    Models of PT-symmetric quantum mechanics provide examples of biorthogonal quantum systems. The latter incorporate all the structure of PT-symmetric models, and allow for generalizations, especially in situations where the PT construction of the dual space fails. The formalism is illustrated by a few exact results for models of the form H = (p + \nu)^2 + \sum_{k>0} \mu_k \exp(ikx). In some non-trivial cases, equivalent Hermitian theories are obtained and shown to be very simple: they are just free (chiral) particles. Field-theory extensions are briefly considered.
    Comment: 34 pages, 5 eps figures; references added and other changes made to conform to journal version.
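    A quick way to see the "equivalent Hermitian theories are free" statement is to truncate H in a plane-wave basis on [0, 2\pi), where exp(ikx) acts as a mode shift; with only k > 0 couplings the matrix is triangular, so its spectrum is exactly the free-particle one, (n + \nu)^2. A minimal numerical sketch in Python, with illustrative values of \mu_k and \nu (not taken from the paper):

        import numpy as np

        # Truncated plane-wave matrix of H = (p + nu)^2 + sum_{k>0} mu_k exp(i k x)
        # on x in [0, 2*pi), basis e^{i n x}, n = -N..N (hbar = 1):
        #   <m|(p + nu)^2|n> = (n + nu)^2 delta_{mn},  <m|exp(ikx)|n> = delta_{m, n+k}.
        # The values of nu and mu_k below are illustrative choices only.
        N, nu = 20, 0.3
        mu = {1: 0.8, 2: 0.5}                    # couplings mu_k for k = 1, 2

        ns = np.arange(-N, N + 1)
        H = np.diag((ns + nu) ** 2).astype(complex)
        for k, mu_k in mu.items():
            for i, n in enumerate(ns):
                if abs(n + k) <= N:              # shifted mode stays in truncation
                    H[i + k, i] += mu_k          # row index of m = n + k

        # Only k > 0 modes appear, so H is triangular in this basis and its
        # eigenvalues are the diagonal entries (n + nu)^2: real, and equal to a
        # free particle's spectrum, despite H being manifestly non-Hermitian.
        print(np.sort(np.linalg.eigvals(H).real)[:5])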

    Depth, distribution, and density of CO2 deposition on Mars

    Observations by the Mars Orbiter Laser Altimeter have been used to detect subtle changes of the polar surface height during the course of seasonal cycles that correlate with the expected pattern of CO2 deposition and sublimation. Using altimetric crossover residuals from the Mars Orbiter Laser Altimeter, we show that while zonally averaged data capture the global behavior of CO2 exchange, there is a dependence of the pattern on longitude. At the highest latitudes the surface height change is as high as 1.5–2 m peak to peak, and it decreases equatorward. Decomposition of the signal into harmonics in time allows inspection of the spatial pattern and shows that the annual component is strongly correlated with the residual south polar cap deposits and, to a lesser extent, with the north polar cap. In the north, the second harmonic (semiannual) component correlates with the location of the ice deposits. The phases of the annual cycles are in agreement with observations by the Thermal Emission Spectrometer of the timing of the annual disappearance of CO2 frost from the surface at the high latitudes. At lower latitudes, frost sublimation (“Crocus date”) predates the mean depositional minima, as expected. These global-scale, volumetric measurements of the distribution of condensed CO2 can be combined with measurements of the deposited column mass density derived from the Neutron Spectrometer on board Mars Odyssey to yield an estimate of the density of the seasonally exchanging material of 0.5 ± 0.1 g/cm^3. These constraints should be considered in models of the Martian climate system and volatile cycles.
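    The two computational steps described above, a harmonic decomposition of the height time series and a density estimate from height change plus column mass, can be sketched as follows. Every number in this snippet is a placeholder for illustration, not a value from the study:

        import numpy as np

        # (1) Least-squares decomposition of a (synthetic) surface-height series
        #     into annual and semiannual harmonics of the Mars year.
        t = np.linspace(0.0, 2.0, 400)                    # time in Mars years
        rng = np.random.default_rng(0)
        h = (0.9 * np.sin(2 * np.pi * t + 0.4)            # synthetic annual term, m
             + 0.2 * np.sin(4 * np.pi * t)                # synthetic semiannual term
             + 0.05 * rng.standard_normal(t.size))        # measurement noise

        A = np.column_stack([np.ones_like(t),
                             np.cos(2 * np.pi * t), np.sin(2 * np.pi * t),
                             np.cos(4 * np.pi * t), np.sin(4 * np.pi * t)])
        coef, *_ = np.linalg.lstsq(A, h, rcond=None)
        annual_amp = np.hypot(coef[1], coef[2])           # ~0.9 m by construction
        semiannual_amp = np.hypot(coef[3], coef[4])       # ~0.2 m

        # (2) Density = column mass per unit area / deposit thickness.
        column_mass = 60.0                                # g/cm^2, placeholder
        thickness_cm = 1.2 * 100.0                        # 1.2 m deposit, in cm
        print(annual_amp, semiannual_amp, column_mass / thickness_cm)  # 0.5 g/cm^3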

    Distributed top-k aggregation queries at large

    Top-k query processing is a fundamental building block for efficient ranking in a large number of applications. Efficiency is a central issue, especially in distributed settings, where the data is spread across different nodes in a network. This paper introduces novel optimization methods for top-k aggregation queries in such distributed environments. The optimizations can be applied to all algorithms that fall into the frameworks of the prior TPUT and KLEE methods. The optimizations address three degrees of freedom: 1) hierarchically grouping input lists into top-k operator trees and optimizing the tree structure, 2) computing data-adaptive scan depths for different input sources, and 3) data-adaptive sampling of a small subset of input sources in scenarios with hundreds or thousands of query-relevant network nodes. All optimizations are based on a statistical cost model that utilizes local synopses, e.g., in the form of histograms, efficiently computed convolutions, and estimators based on order statistics. The paper presents comprehensive experiments, with three different real-life datasets and using the ns-2 network simulator for a packet-level simulation of a large Internet-style network.
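    For readers unfamiliar with the baseline these optimizations build on, a minimal reconstruction of the three-phase TPUT algorithm (not the paper's optimized variant, and with network rounds replaced by local dictionaries) looks roughly like this:

        from collections import defaultdict

        def tput_top_k(nodes, k):
            """Sketch of three-phase TPUT; assumes at least k distinct items."""
            m = len(nodes)

            # Phase 1: fetch each node's local top-k and form partial sums.
            partial = defaultdict(float)
            for node in nodes:
                for item, score in sorted(node.items(), key=lambda kv: -kv[1])[:k]:
                    partial[item] += score
            tau1 = sorted(partial.values(), reverse=True)[k - 1]

            # Phase 2: fetch every item with local score >= tau1 / m; an item
            # below this threshold on all m nodes cannot reach tau1 in total.
            threshold = tau1 / m
            partial, seen = defaultdict(float), [set() for _ in range(m)]
            for i, node in enumerate(nodes):
                for item, score in node.items():
                    if score >= threshold:
                        partial[item] += score
                        seen[i].add(item)

            # Prune by upper bound: each unseen list contributes < threshold.
            def upper(item):
                return partial[item] + threshold * sum(item not in s for s in seen)

            tau2 = sorted(partial.values(), reverse=True)[k - 1]
            candidates = [it for it in partial if upper(it) >= tau2]

            # Phase 3: look up exact totals for surviving candidates only.
            exact = {it: sum(n.get(it, 0.0) for n in nodes) for it in candidates}
            return sorted(exact.items(), key=lambda kv: -kv[1])[:k]

        print(tput_top_k([{"a": 9, "b": 7, "c": 1}, {"a": 8, "c": 6, "d": 5}], k=2))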

    Estimating cardinalities with deep sketches

    We introduce Deep Sketches, which are compact models of databases that allow us to estimate the result sizes of SQL queries. Deep Sketches are powered by a new deep learning approach to cardinality estimation that can capture correlations between columns, even across tables. Our demonstration allows users to define such sketches on the TPC-H and IMDb datasets, monitor the training process, and run ad-hoc queries against trained sketches. We also estimate query cardinalities with HyPer and PostgreSQL to visualize the gains over traditional cardinality estimators.
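    The core idea, learning a mapping from query predicates to (log-)cardinalities so that cross-column correlations are captured, can be illustrated with a deliberately simplified stand-in. The correlated columns, random range queries, and generic scikit-learn MLP below are assumptions for illustration, not the paper's architecture:

        import numpy as np
        from sklearn.neural_network import MLPRegressor

        # A table with two correlated columns; an independence assumption would
        # badly misestimate conjunctive predicates on (a, b).
        rng = np.random.default_rng(0)
        a = rng.uniform(0, 1, 100_000)
        b = np.clip(a + rng.normal(0, 0.05, a.size), 0, 1)

        def true_card(lo_a, hi_a, lo_b, hi_b):
            return np.sum((a >= lo_a) & (a <= hi_a) & (b >= lo_b) & (b <= hi_b))

        # Training data: random conjunctive range queries and true cardinalities.
        Q = rng.uniform(0, 1, (5000, 4))
        Q[:, 1] = np.maximum(Q[:, 0], Q[:, 1])        # ensure lo <= hi per column
        Q[:, 3] = np.maximum(Q[:, 2], Q[:, 3])
        y = np.log1p([true_card(*q) for q in Q])      # learn in log space

        model = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=500,
                             random_state=0).fit(Q, y)

        q = np.array([[0.2, 0.3, 0.2, 0.3]])
        print("estimated:", np.expm1(model.predict(q))[0],
              "true:", true_card(*q[0]))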

    Optimizing the decoy-state BB84 QKD protocol parameters

    Quantum key distribution (QKD) protocols allow for information-theoretically secure distribution of (classical) cryptographic key material. However, due to practical limitations the performance of QKD implementations is somewhat restricted. For this reason, it is crucial to find optimal protocol parameters while guaranteeing information-theoretic security. The performance of a QKD implementation is determined by the tightness of the underlying security analysis. In particular, the security analysis determines the key rate, i.e., the amount of cryptographic key material that can be distributed per time unit. Nowadays, the security analyses of various QKD protocols are well understood. It is known that optimal protocol parameters, such as the number of decoy states and their intensities, can be found by solving a nonlinear optimization problem. The complexity of this optimization problem is typically handled by making a number of heuristic assumptions. For instance, the number of decoy states is restricted to only one or two, with one of the decoy intensities set to a fixed value, and vacuum states are ignored as they are assumed to contribute only marginally to the secure key rate. These assumptions simplify the optimization problem and reduce the size of the search space significantly. However, they also cause the security analysis to be non-tight, and thereby result in sub-optimal performance. In this work, we follow a more rigorous approach using both linear and nonlinear programs describing the optimization problem. Our approach, focusing on the decoy-state BB84 protocol, allows the heuristic assumptions to be omitted, and therefore results in a tighter security analysis with better protocol parameters. We show an improved performance for the decoy-state BB84 QKD protocol, demonstrating that the heuristic assumptions typically made are too restrictive. Moreover, our improved optimization framework shows that the complexity of the performance optimization problem can also be handled without making heuristic assumptions, even with limited computational resources available.
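    As a point of reference for the heuristic two-intensity setup the abstract criticizes, the textbook vacuum-plus-weak-decoy bounds with a brute-force search over the signal and decoy intensities can be sketched as follows. The channel parameters are assumptions, and this is the standard heuristic baseline, not the paper's linear/nonlinear-programming method:

        import numpy as np

        def h2(x):                                    # binary entropy
            x = np.clip(x, 1e-12, 1 - 1e-12)
            return -x * np.log2(x) - (1 - x) * np.log2(1 - x)

        # Assumed channel model: transmittance eta, dark-count yield Y0,
        # misalignment error ed, error-correction inefficiency f, sifting q.
        eta, Y0, ed, f, q = 0.05, 1e-5, 0.01, 1.16, 0.5

        def gain(i):                                  # overall gain Q_i
            return Y0 + 1 - np.exp(-eta * i)

        def qber(i):                                  # QBER E_i, with e0 = 1/2
            return (0.5 * Y0 + ed * (1 - np.exp(-eta * i))) / gain(i)

        def key_rate(mu, nu):                         # requires mu > nu > 0
            Qm, Qn, Em, En = gain(mu), gain(nu), qber(mu), qber(nu)
            # Decoy-state lower bound on the single-photon yield Y1.
            Y1 = (mu / (mu * nu - nu**2)) * (
                Qn * np.exp(nu) - Qm * np.exp(mu) * nu**2 / mu**2
                - (mu**2 - nu**2) / mu**2 * Y0)
            if Y1 <= 0:
                return 0.0
            e1 = (En * Qn * np.exp(nu) - 0.5 * Y0) / (Y1 * nu)   # upper bound
            Q1 = Y1 * mu * np.exp(-mu)                # single-photon gain
            R = q * (-Qm * f * h2(Em) + Q1 * (1 - h2(e1)))       # GLLP-style rate
            return max(R, 0.0)

        best = max(((key_rate(m, n), m, n)
                    for m in np.linspace(0.1, 0.9, 33)
                    for n in np.linspace(0.01, 0.09, 17)), key=lambda t: t[0])
        print("R = %.3e at mu = %.2f, nu = %.3f" % best)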